The deterministic Lasso
Author
Abstract
We study high-dimensional generalized linear models and empirical risk minimization using the Lasso. An oracle inequality is presented, under a so-called compatibility condition. Our aim is threefold: to prove a result announced in van de Geer (2007), to provide a simple proof with simple constants, and to separate the stochastic problem from the deterministic one.
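The Lasso referred to in the abstract is the ℓ1-penalized empirical risk minimizer. As a minimal illustration (not the paper's analysis, just the estimator itself), a coordinate-descent sketch for the least-squares Lasso, minimizing (1/2n)||y − Xb||² + λ||b||₁, might look like:

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for (1/(2n)) * ||y - X b||^2 + lam * ||b||_1.
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature X_j^T X_j / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with coordinate j removed.
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b
```

For a large penalty λ every coefficient is thresholded to zero; for a small penalty the estimate approaches the least-squares fit, which is the basic shrinkage trade-off the oracle inequality quantifies.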
Similar works
Lasso Regression: Estimation and Shrinkage via Limit of Gibbs Sampling
The application of the lasso is espoused in high-dimensional settings where only a small number of the regression coefficients are believed to be nonzero (i.e., the solution is sparse). Moreover, statistical properties of high-dimensional lasso estimators are often proved under the assumption that the correlation between the predictors is bounded. In this vein, coordinatewise methods, the most ...
Exact LASSO Solutions for a Class of Constrained Cardinality Minimization Problems
This paper shows that the least absolute shrinkage and selection operator (LASSO) can provide an exact optimal solution to a special type of constrained cardinality minimization problem, which is motivated from a sensor network measurement robustness analysis problem. The constraint matrix of the considered problem is totally unimodular. This is shown to imply that LASSO leads to a tight linear...
Generating Probabilities From Numerical Weather Forecasts by Logistic Regression
Logistic models are studied as a tool to convert output from numerical weather forecasting systems (deterministic and ensemble) into probability forecasts for binary events. A logistic model is obtained by setting the logarithmic odds ratio equal to a linear combination of the inputs. As with any statistical model, logistic models will suffer from over-fitting if the number of inputs is comparable to th...
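The logistic link described above can be sketched in a few lines; the weights and inputs here are hypothetical placeholders, not values from the paper:

```python
import math

def logistic_prob(x, w, b):
    # Log-odds are a linear combination of the inputs:
    #   log(p / (1 - p)) = w . x + b,
    # so the event probability is the logistic function of that combination.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

With a zero linear predictor the model returns probability 0.5, and the probability increases monotonically in the forecast inputs that carry positive weights.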
Regression Performance of Group Lasso for Arbitrary Design Matrices
In many linear regression problems, explanatory variables are activated in groups or clusters; group lasso has been proposed for regression in such cases. This paper studies the nonasymptotic regression performance of group lasso using ℓ1/ℓ2 regularization for arbitrary (random or deterministic) design matrices. In particular, the paper establishes under a statistical prior on the set of nonzer...
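The ℓ1/ℓ2 regularizer mentioned above is an ℓ1 norm across groups of an ℓ2 norm within each group, which zeroes out whole groups at once. A minimal sketch of the penalty (group structure here is an assumed example):

```python
import math

def group_lasso_penalty(beta, groups):
    # l1/l2 penalty: the sum over groups g of the Euclidean norm
    # of the coefficient vector restricted to g.
    return sum(
        math.sqrt(sum(beta[j] ** 2 for j in g))
        for g in groups
    )
```

Because the within-group ℓ2 norm is only zero when every coefficient in the group is zero, penalizing its sum encourages sparsity at the group level rather than coordinate by coordinate.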
Stochastic Methods for ℓ1 Regularized Loss Minimization
We describe and analyze two stochastic methods for ℓ1 regularized loss minimization problems, such as the Lasso. The first method updates the weight of a single feature at each iteration while the second method updates the entire weight vector but only uses a single training example at each iteration. In both methods, the choice of feature/example is uniformly at random. Our theoretical runtime...
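In the spirit of the first method above (one uniformly random feature updated per iteration), a sketch of stochastic coordinate descent for the ℓ1-regularized least-squares loss, under the assumption of an exact per-coordinate minimization step, might look like:

```python
import random
import numpy as np

def stochastic_cd_lasso(X, y, lam, n_steps=5000, seed=0):
    # Each step picks one coordinate j uniformly at random and
    # exactly minimizes (1/(2n))||y - X b||^2 + lam*||b||_1 over b[j].
    rng = random.Random(seed)
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_steps):
        j = rng.randrange(p)
        r_j = y - X @ b + X[:, j] * b[j]   # partial residual
        rho = X[:, j] @ r_j / n
        b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b
```

Each iteration touches a single column of X, which is what makes this style of update cheap when the number of features is large.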